Improving Neural Parsing by Disentangling Model Combination and Reranking Effects
Recent work has proposed several generative neural models for constituency
parsing that achieve state-of-the-art results. Since direct search in these
generative models is difficult, they have primarily been used to rescore
candidate outputs from base parsers in which decoding is more straightforward.
We first present an algorithm for direct search in these generative models. We
then demonstrate that the rescoring results are at least partly due to implicit
model combination rather than reranking effects. Finally, we show that explicit
model combination can improve performance even further, resulting in new
state-of-the-art numbers on the PTB of 94.25 F1 when training only on gold data
and 94.66 F1 when using external data. Comment: ACL 2017. The first two authors contributed equally.
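The explicit model combination described above can be sketched as reranking by a weighted sum of log-probabilities from the base parser and the generative model. This is a minimal illustrative sketch, not the paper's implementation; the candidate structure, field names, and the single interpolation weight `alpha` are assumptions.

```python
# Hypothetical sketch: rerank candidate parses by combining scores
# from a base parser and a generative rescoring model.

def combine_and_rerank(candidates, alpha=0.5):
    """Return the candidate with the highest interpolated log-probability.

    candidates: list of dicts with 'tree', 'base_logp', 'gen_logp'
    alpha: interpolation weight on the base parser's score (assumed)
    """
    def combined(c):
        return alpha * c["base_logp"] + (1 - alpha) * c["gen_logp"]
    return max(candidates, key=combined)

# Toy example with two candidate trees (scores are made up):
candidates = [
    {"tree": "(S (NP ...) (VP ...))",          "base_logp": -12.3, "gen_logp": -10.1},
    {"tree": "(S (NP ...) (VP ...) (PP ...))", "base_logp": -11.8, "gen_logp": -13.0},
]
best = combine_and_rerank(candidates, alpha=0.5)
```

Pure reranking corresponds to `alpha=0`; the abstract's point is that intermediate values, which blend both models, can score higher than either model alone.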
Abstract Syntax Networks for Code Generation and Semantic Parsing
Tasks like code generation and semantic parsing require mapping unstructured
(or partially structured) inputs to well-formed, executable outputs. We
introduce abstract syntax networks, a modeling framework for these problems.
The outputs are represented as abstract syntax trees (ASTs) and constructed by
a decoder with a dynamically-determined modular structure paralleling the
structure of the output tree. On the benchmark Hearthstone dataset for code
generation, our model obtains 79.2 BLEU and 22.7% exact match accuracy,
compared to previous state-of-the-art values of 67.1 and 6.1%. Furthermore, we
perform competitively on the Atis, Jobs, and Geo semantic parsing datasets with
no task-specific engineering. Comment: ACL 2017. MR and MS contributed equally.
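The decoder described above composes modules dynamically so that its structure mirrors the abstract syntax tree it emits. A minimal non-neural sketch of that idea, assuming a toy grammar where each node type deterministically expands into fixed child types (a real abstract syntax network would instead choose expansions with learned per-module networks):

```python
# Hypothetical sketch: a modular decoder whose call structure
# parallels the AST it constructs. The GRAMMAR table is a stand-in
# for the learned modules of an abstract syntax network.

GRAMMAR = {
    "Module": ["FunctionDef"],
    "FunctionDef": ["arguments", "Return"],
    "arguments": [],
    "Return": ["Name"],
    "Name": [],
}

def decode_node(node_type):
    """Recursively build an AST node by invoking the module for each
    child type; the recursion tree matches the output tree."""
    children = [decode_node(child) for child in GRAMMAR[node_type]]
    return {"type": node_type, "children": children}

tree = decode_node("Module")
```

The key design point the abstract emphasizes is this structural coupling: which module runs next is determined by the partially built tree, rather than by a fixed decoder architecture.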